
Gatsby Computational Neuroscience Unit


Joel Zylberberg


Wednesday 10th April 2019


Time: 4.00pm


Ground Floor Seminar Room

25 Howland Street, London, W1T 4JG


(Learning) Visual Representations

Visual stimuli elicit action potentials in the retina that propagate to the brain, where further action potentials are elicited. What is the language of this signalling? In other words, how do patterns of action potentials in each neural circuit correspond to stimuli in the outside world? The first part of this talk will highlight recent work from my laboratory that confronts this problem in the retina and visual cortex. Next, I will discuss ongoing work that asks how those representations are learned. Specifically, I will highlight a joint theory-experiment research program investigating whether and how the brain's visual neural circuits implement the same kinds of learning algorithms found in modern artificial intelligence systems.

Bio:

Joel Zylberberg obtained his Ph.D. in Physics from the University of California, Berkeley. Since starting his first faculty position at the University of Colorado in 2015, he has received the Sloan Research Fellowship in Neuroscience, the Google Faculty Research Award in Computational Neuroscience, and the Canadian Institute for Advanced Research (CIFAR) Azrieli Global Scholar Award for Learning in Machines and Brains. In 2019, Zylberberg moved to York University, where he is currently Assistant Professor of Physics and Canada Research Chair in Computational Neuroscience.